Convergence results for projected line-search methods on varieties of low-rank matrices via Łojasiewicz inequality
Authors
Abstract
As an initial step towards low-rank optimization algorithms using hierarchical tensors, the aim of this paper is to derive convergence results for projected line-search methods on the real-algebraic variety M≤k of real m × n matrices of rank at most k. Such methods extend Riemannian optimization methods, which are successfully used on the smooth manifold Mk of rank-k matrices, to its closure by taking steps along gradient-related directions in the tangent cone, and afterwards projecting back to M≤k. Considering such a method circumvents the difficulties that arise from the non-closedness and the unbounded curvature of Mk. Pointwise convergence is obtained for real-analytic functions on the basis of a Łojasiewicz inequality for the projection of the negative gradient onto the tangent cone. If the resulting limit point lies on the smooth part of M≤k, i.e., in Mk, this reduces to more or less known results, but with the benefit that asymptotic convergence rate estimates (for specific step sizes) can be obtained without an a priori curvature bound, simply from the fact that the limit lies on a smooth manifold. At the same time, one can give a convincing justification for assuming critical points to lie in Mk: if X is a critical point of f on M≤k, then either X has rank k, or ∇f(X) = 0.
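The iteration described above takes the form X_{n+1} = P_{M≤k}(X_n + α_n Z_n), where Z_n is the projection of −∇f(X_n) onto the tangent cone and P_{M≤k} is the metric projection onto the variety, computable by truncated SVD. The NumPy sketch below illustrates one such step under simplifying assumptions: a fixed step size α in place of an actual line search, and the tangent-space projection at a rank-k point rather than the full tangent-cone projection used in the paper; all function names are ours, not the authors'.

import numpy as np

def truncated_svd_projection(Y, k):
    # Metric projection of Y onto M_{<=k}: best rank-k approximation
    # via truncated SVD (well defined up to ties in singular values).
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

def tangent_projection(X, G, k):
    # Projection of G onto the tangent space of M_k at X = U S V^T:
    # P(G) = P_U G + G P_V - P_U G P_V with P_U = U U^T, P_V = V V^T.
    # At points of rank < k the tangent cone is strictly larger; this
    # sketch only covers the smooth (rank-k) case.
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    PU = U[:, :k] @ U[:, :k].T
    PV = Vt[:k, :].T @ Vt[:k, :]
    return PU @ G + G @ PV - PU @ G @ PV

def projected_step(X, grad_f, k, alpha=0.1):
    # One projected step with fixed step size alpha: move along the
    # projected negative gradient, then retract back onto M_{<=k}.
    Z = -tangent_projection(X, grad_f(X), k)
    return truncated_svd_projection(X + alpha * Z, k)

# Toy usage: f(X) = ||X - A||_F^2, whose minimizer over M_{<=2} is the
# best rank-2 approximation of A.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 5))
X = truncated_svd_projection(rng.standard_normal((6, 5)), 2)
for _ in range(200):
    X = projected_step(X, lambda Y: 2.0 * (Y - A), 2)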
Similar papers
Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
In 1963, Polyak proposed a simple condition that is sufficient to show a global linear convergence rate for gradient descent. This condition is a special case of the Łojasiewicz inequality proposed in the same year, and it does not require strong convexity (or even convexity). In this work, we show that this much older Polyak-Łojasiewicz (PL) inequality is actually weaker than the main condition...
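For reference, the PL inequality in its standard form is textbook background (stated here for context, not quoted from the snippet), with PL constant μ > 0 and minimum value f*:

\[
\frac{1}{2}\,\lVert \nabla f(x) \rVert^{2} \;\ge\; \mu \bigl( f(x) - f^{\ast} \bigr) \qquad \text{for all } x.
\]

For an L-smooth f, gradient descent with step size 1/L then satisfies f(x_{t+1}) − f* ≤ (1 − μ/L)(f(x_t) − f*), which is the global linear rate mentioned above.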
Some rank equalities for finitely many tripotent matrices
A rank equality is established for the sum of finitely many tripotent matrices via elementary block matrix operations. Moreover, by using this equality and Theorems 8 and 10 in [Chen M. et al., On the open problem related to rank equalities for the sum of finitely many idempotent matrices and its applications, The Scientific World Journal 2014 (2014), Article ID 702413, 7 page...
Linear Regression under Fixed-Rank Constraints: A Riemannian Approach
In this paper, we tackle the problem of learning a linear regression model whose parameter is a fixed-rank matrix. We study the Riemannian manifold geometry of the set of fixed-rank matrices and develop efficient line-search algorithms. The proposed algorithms have many applications, scale to high-dimensional problems, enjoy local convergence properties and confer a geometric basis to recent con...
A Nonconvex Free Lunch for Low-Rank plus Sparse Matrix Recovery
We study the problem of low-rank plus sparse matrix recovery. We propose a generic and efficient nonconvex optimization algorithm based on projected gradient descent and a double thresholding operator, with much lower computational complexity. Compared with existing convex-relaxation based methods, the proposed algorithm recovers the low-rank plus sparse matrices for free, without incurring any a...
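The double thresholding idea can be illustrated as gradient steps on the low-rank and sparse components, each followed by its own hard-thresholding projection. The sketch below is a minimal instance of this general pattern for the model M ≈ L + S, not the paper's exact algorithm; the rank budget k, sparsity budget s, step size eta, and all function names are illustrative assumptions.

import numpy as np

def rank_threshold(Y, k):
    # Hard thresholding onto rank <= k: truncated SVD.
    U, sv, Vt = np.linalg.svd(Y, full_matrices=False)
    return U[:, :k] @ np.diag(sv[:k]) @ Vt[:k, :]

def sparsity_threshold(Y, s):
    # Hard thresholding onto sparsity <= s: keep the s entries of
    # largest magnitude, zero the rest.
    Z = np.zeros_like(Y)
    idx = np.unravel_index(np.argsort(np.abs(Y), axis=None)[-s:], Y.shape)
    Z[idx] = Y[idx]
    return Z

def double_threshold_step(L, S, M, k, s, eta=0.5):
    # One projected-gradient step for f(L, S) = 0.5 * ||L + S - M||_F^2:
    # both partial gradients equal the residual R, and each component
    # is projected separately after its gradient update.
    R = L + S - M
    return rank_threshold(L - eta * R, k), sparsity_threshold(S - eta * R, s)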